
skpkg: get tests temporarily passing for migration #39


Merged
merged 7 commits into diffpy:migration on Jul 6, 2025

Conversation


@dabeycorn dabeycorn commented Jul 1, 2025

I've gotten the tests to pass most of the time by replacing the expected output with what's actually being output.

@dabeycorn dabeycorn marked this pull request as draft July 1, 2025 04:41
@sbillinge (Contributor)

You can probably make it more robust if you turn the string of numbers into arrays of floats, use numpy.allclose, and degrade the tolerance until it passes. We are making a worse and worse test, but it is better than skipping the test, because at least we have a test passing and we want it to still be passing after the migration...
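A minimal sketch of that approach (the helper name and the starting tolerance are illustrative, not code from this PR):

```python
import numpy as np

def assert_numbers_close(expected_str, actual_str, rtol=1e-4):
    """Compare two whitespace-separated strings of numbers as float arrays."""
    expected = np.array(expected_str.split(), dtype=float)
    actual = np.array(actual_str.split(), dtype=float)
    # loosen rtol step by step until the comparison passes
    assert np.allclose(expected, actual, rtol=rtol)
```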

@dabeycorn (Author)

Sounds good. I'll finish this PR and then continue with the rest of the migration process.


codecov bot commented Jul 5, 2025

Welcome to Codecov 🎉

Once you merge this PR into your default branch, you're all set! Codecov will compare coverage reports and display results in all future pull requests.

Thanks for integrating Codecov - We've got you covered ☂️

@dabeycorn dabeycorn marked this pull request as ready for review July 5, 2025 23:40
Commit: This reverts commit eebca9f.
Signed-off-by: Dasun Abeykoon <[email protected]>
@dabeycorn (Author) commented Jul 5, 2025

@sbillinge I replaced the assertion that compared raw JSON data with something more lenient. Ready for review.
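One way such a lenient check can look (an illustrative sketch, not the assertion actually used in this PR):

```python
import json
import math

def json_approx_equal(a, b, rel_tol=1e-6):
    """Recursively compare parsed JSON, letting numeric leaves differ slightly."""
    if isinstance(a, dict) and isinstance(b, dict):
        return a.keys() == b.keys() and all(
            json_approx_equal(a[k], b[k], rel_tol) for k in a
        )
    if isinstance(a, list) and isinstance(b, list):
        return len(a) == len(b) and all(
            json_approx_equal(x, y, rel_tol) for x, y in zip(a, b)
        )
    if isinstance(a, (int, float)) and isinstance(b, (int, float)):
        return math.isclose(a, b, rel_tol=rel_tol)
    return a == b

# e.g. assert json_approx_equal(json.loads(expected), json.loads(actual))
```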

@sbillinge (Contributor) left a comment


There is a bit too much going on in this PR. I suggest that maybe we close it and think about writing a new test. It is actually good practice for writing tests.

Step 1 is to figure out what behavior we want that function to have. It is not even clear to me which function is being tested.
Step 2 is to make comments in the test file that capture the inputs and the expected behavior of that function.
Step 3 is to then write the test for each case, usually using pytest.mark.parametrize (see the sketch after this list).

Push changes at each step to a PR for discussion.
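A skeleton for step 3 (the function under test and the cases are placeholders, since it is not yet clear which function is being tested):

```python
import pytest

def parse_data(text):
    # stand-in for the real function under test, which lives in the package
    return [float(x) for x in text.split()]

# Step 2 comments: each tuple records an input and the behavior we expect.
@pytest.mark.parametrize(
    "input_text, expected",
    [
        ("1.0", [1.0]),           # a single value parses to a one-element list
        ("1.0 2.5", [1.0, 2.5]),  # whitespace-separated values parse in order
        ("", []),                 # empty input yields an empty list
    ],
)
def test_parse_data(input_text, expected):
    assert parse_data(input_text) == expected
```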

@sbillinge (Contributor)

On second thought, since this is working I will just merge it, but I think a bit of refactoring would be nice in the next release. The functions are not well abstracted, and so not well reused across our different sub-packages (the babies), which are all doing many of the same things (pulling repos, reading files, etc.). I think our design could be quite a bit better in a do-over.

@sbillinge sbillinge merged commit 18b5e03 into diffpy:migration Jul 6, 2025
4 checks passed
@dabeycorn (Author) commented Jul 6, 2025

I understand. I can take a crack at rewriting the tests later. For now, I'll just continue the migration process.

@dabeycorn dabeycorn deleted the fix-tests branch July 6, 2025 02:18
@sbillinge (Contributor)

> I understand. I can take a crack at rewriting the tests later. For now, I'll just continue the migration process.

Please post an issue now to do it later.
